Derivation Tree Analysis for Accelerated Fixed-Point Computation
Authors
Abstract
We show that for several classes of idempotent semirings the least fixed-point of a polynomial system of equations X = f(X) is equal to the least fixed-point of a linear system obtained by “linearizing” the polynomials of f in a certain way. Our proofs rely on derivation tree analysis, a proof principle that combines methods from algebra, calculus, and formal language theory, and was first used in [5] to show that Newton’s method over commutative and idempotent semirings converges in a linear number of steps. Our results lead to efficient generic algorithms for computing the least fixed-point. We use these algorithms to derive several consequences, including an O(N³) algorithm for computing the throughput of a context-free grammar (obtained by speeding up the O(N⁴) algorithm of [2]), and a generalization of Courcelle’s result stating that the downward-closed image of a context-free language is regular [3].
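The least fixed-point that the abstract refers to can always be approximated by Kleene iteration: start from the bottom element of the semiring and apply f until the value stabilizes. The sketch below illustrates this on a hypothetical one-variable system over the tropical (min, +) semiring, which is idempotent; the system X = min(5.0, 1.0 + X + X) and its weights are invented for illustration and do not come from the paper.

```python
# Kleene iteration for the least fixed point of X = f(X) over the
# tropical (min, +) semiring, an idempotent semiring.
# Toy system (an assumption for illustration): X = min(5.0, 1.0 + X + X),
# i.e. a weighted version of the grammar rules X -> a | b X X.
import math

def f(x):
    return min(5.0, 1.0 + x + x)

x = math.inf  # bottom element of the (min, +) semiring
for _ in range(100):
    nxt = f(x)
    if nxt == x:  # reached the least fixed point
        break
    x = nxt
print(x)  # → 5.0
```

Kleene iteration may need many steps in general; the point of the paper is that linearization and Newton-style acceleration reach the same least fixed-point faster.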
Similar Papers
Perturbations of Jordan higher derivations in Banach ternary algebras: An alternative fixed point approach
Using fixed point methods, we investigate approximately higher ternary Jordan derivations in Banach ternary algebras via the Cauchy functional equation $$f(\lambda_{1}x+\lambda_{2}y+\lambda_{3}z)=\lambda_{1}f(x)+\lambda_{2}f(y)+\lambda_{3}f(z).$$
Solving Fixed-Point Equations by Derivation Tree Analysis
Systems of equations over ω-continuous semirings can be mapped to context-free grammars in a natural way. We show how an analysis of the derivation trees of the grammar yields new algorithms for approximating and even computing exactly the least solution of the system.
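The correspondence mentioned above can be made concrete: over the language semiring (union as sum, concatenation as product), the equation X = {"a"} ∪ X·X is the natural image of the grammar X → a | X X, and each Kleene approximant collects the yields of derivation trees of bounded height. The toy system below is an assumption chosen for illustration, not an example from the paper.

```python
# The equation X = {"a"} ∪ X·X over the language semiring corresponds
# to the grammar X -> a | X X.  Iterating f from the empty language
# builds up the yields of derivation trees of increasing height.
def f(X):
    return {"a"} | {u + v for u in X for v in X}

X = set()          # bottom element: the empty language
for _ in range(4): # four Kleene iterations
    X = f(X)

# After 4 iterations the longest yield has length 2**(4-1) = 8.
print(sorted(X, key=len)[:3])  # → ['a', 'aa', 'aaa']
```

Derivation tree analysis studies exactly these trees to show that a cleverly linearized system already generates enough of them to reach the full least solution.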
An Action Computation Tree Logic With Unless Operator
This paper is about action computation tree logic (ACTL), a propositional branching-time temporal logic very suitable for specifying properties of concurrent systems described with processes. A new variant of ACTL is introduced, which is based on temporal operators until and unless, whereas all other temporal operators are derived from them. A fixed point characterisation usable for global mode...
A Fixed-Point of View on Gradient Methods for Big Data
Interpreting gradient methods as fixed-point iterations, we provide a detailed analysis of those methods for minimizing convex objective functions. Due to their conceptual and algorithmic simplicity, gradient methods are widely used in machine learning for massive data sets (big data). In particular, stochastic gradient methods are considered the de-facto standard for training deep neural netwo...
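The fixed-point view described in this abstract amounts to writing the gradient step as an operator T(x) = x − α∇f(x), whose fixed points are exactly the stationary points of f. A minimal sketch, using a hypothetical quadratic objective f(x) = (x − 3)² chosen purely for illustration:

```python
# Gradient descent as the fixed-point iteration x_{k+1} = T(x_k),
# where T(x) = x - alpha * f'(x).  Minimizers of f are fixed points of T.
# Toy objective (an assumption): f(x) = (x - 3)^2, so f'(x) = 2(x - 3).
def T(x, alpha=0.1):
    grad = 2.0 * (x - 3.0)
    return x - alpha * grad

x = 0.0
for _ in range(200):
    x = T(x)
print(round(x, 6))  # → 3.0
```

Here T is a contraction for 0 < α < 1 (the error shrinks by the factor |1 − 2α| each step), so the iteration converges to the unique fixed point x = 3, matching the Banach fixed-point intuition behind the analysis.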
Approximately higher Hilbert $C^*$-module derivations
We show that higher derivations on a Hilbert $C^{*}$-module associated with the Cauchy functional equation satisfy generalized Hyers--Ulam stability.